# Whole Word Masking Training
## Bert Base Japanese Char Whole Word Masking
- Organization: tohoku-nlp · Task: Large Language Model (Japanese) · Downloads: 1,724 · Likes: 4

A BERT model pre-trained on Japanese text using character-level tokenization and whole word masking, suitable for Japanese natural language processing tasks.
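A minimal sketch of masked-token prediction with this model, using the `transformers` fill-mask pipeline. The repository id below is an assumption inferred from the organization and model names listed above, and the tokenizer is assumed to require the usual Japanese BERT dependencies (fugashi plus a MeCab dictionary).

```python
# Sketch: character-level masked-token prediction (assumed repo id).
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="tohoku-nlp/bert-base-japanese-char-whole-word-masking",  # assumed repo id
)

# With character-level tokenization, one [MASK] stands for one masked character.
for candidate in fill_mask("東京は日本の首[MASK]です。"):
    print(candidate["token_str"], round(candidate["score"], 3))
```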
## Bert Base Parsbert Uncased
- Organization: HooshvareLab · Task: Large Language Model · Downloads: 99.81k · Likes: 38

A Persian language understanding model based on the Transformer architecture that outperforms multilingual BERT and other hybrid models on Persian tasks.
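A minimal sketch of loading ParsBERT for masked-language-model inference; the repository id is an assumption based on the organization and model names above.

```python
# Sketch: Persian masked-language-model inference with ParsBERT (assumed repo id).
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

model_id = "HooshvareLab/bert-base-parsbert-uncased"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)

# "Persian is the official language of Iran", with the language name masked.
for candidate in fill_mask(f"زبان {tokenizer.mask_token} زبان رسمی ایران است."):
    print(candidate["token_str"], round(candidate["score"], 3))
```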
## Bert Base Parsbert Ner Uncased
- Organization: HooshvareLab · License: Apache-2.0 · Task: Sequence Labeling (Other) · Downloads: 6,130 · Likes: 5

A Transformer-based Persian language understanding model, optimized for Persian named entity recognition (NER) tasks.
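A minimal sketch of Persian NER with this fine-tuned ParsBERT model via the token-classification pipeline. The repository id is an assumption from the names above; the ArmanNER and PeymaNER variants listed below would be used the same way by swapping in their respective repository ids.

```python
# Sketch: Persian named entity recognition with ParsBERT (assumed repo id).
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="HooshvareLab/bert-base-parsbert-ner-uncased",  # assumed repo id
    aggregation_strategy="simple",  # merge word pieces into whole entity spans
)

# "The Hooshvare company operates in Tehran."
for entity in ner("شرکت هوشواره در تهران فعالیت می‌کند."):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```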
## Bert Base Parsbert Armanner Uncased
- Organization: HooshvareLab · License: Apache-2.0 · Task: Sequence Labeling (Other) · Downloads: 120 · Likes: 3

ParsBERT is a Persian language understanding model based on the BERT architecture; this variant is designed specifically for Persian named entity recognition tasks.
## Bert Base Parsbert Peymaner Uncased
- Organization: HooshvareLab · License: Apache-2.0 · Task: Sequence Labeling (Other) · Downloads: 40 · Likes: 0

ParsBERT is a Persian language understanding model based on the BERT architecture; this variant is optimized for Persian named entity recognition (NER) tasks.